Learning Two-Layer Contractive Encodings

Authors

  • Hannes Schulz
  • Sven Behnke
Abstract

Unsupervised learning of feature hierarchies is often a good initialization for supervised training of deep architectures. In existing deep learning methods, these feature hierarchies are built layer by layer in a greedy fashion using auto-encoders or restricted Boltzmann machines. Both yield encoders, which compute linear projections followed by a smooth thresholding function. In this work, we demonstrate that these encoders fail to find stable features when the required computation is in the exclusive-or class. To overcome this limitation, we propose a two-layer encoder which is not restricted in the type of features it can learn. The proposed encoder can be regularized by an extension of previous work on contractive regularization. We demonstrate the advantages of two-layer encoders qualitatively, as well as on commonly used benchmark datasets.
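The abstract's central claim is that a single linear projection followed by a smooth thresholding function cannot represent XOR-class features, while a two-layer encoder can, and that such an encoder can be regularized by penalizing the Frobenius norm of its Jacobian (contractive regularization). The following is a minimal numpy sketch of these two ideas, not the authors' implementation; the weight values are hand-picked purely to illustrate that two sigmoid layers suffice to compute XOR.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def two_layer_encode(x, W1, b1, W2, b2):
    """Two-layer encoder: two affine maps, each followed by a sigmoid."""
    h1 = sigmoid(W1 @ x + b1)   # intermediate layer
    h2 = sigmoid(W2 @ h1 + b2)  # code layer
    return h1, h2

def encoder_jacobian(x, W1, b1, W2, b2):
    """Jacobian dh2/dx of the two-layer encoder, via the chain rule."""
    h1, h2 = two_layer_encode(x, W1, b1, W2, b2)
    D2 = np.diag(h2 * (1.0 - h2))  # sigmoid derivative at the code layer
    D1 = np.diag(h1 * (1.0 - h1))  # sigmoid derivative at the hidden layer
    return D2 @ W2 @ D1 @ W1

def contractive_penalty(x, W1, b1, W2, b2):
    """Squared Frobenius norm of the encoder Jacobian (the regularizer)."""
    J = encoder_jacobian(x, W1, b1, W2, b2)
    return float(np.sum(J ** 2))

# Hand-picked weights (illustrative only): the first layer computes an
# approximate OR and AND of the two binary inputs; the second layer
# combines them into XOR -- a function no single linear projection plus
# smooth threshold can represent.
W1 = np.array([[20.0, 20.0], [20.0, 20.0]])
b1 = np.array([-10.0, -30.0])
W2 = np.array([[20.0, -20.0]])
b2 = np.array([-10.0])

for x in ([0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]):
    _, code = two_layer_encode(np.array(x), W1, b1, W2, b2)
```

In a real training setup the penalty above would be added to the reconstruction loss and minimized jointly over all weights; here it is only evaluated to show the quantities involved.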


Similar Articles

Two-Layer Contractive Encodings with Linear Transformation of Perceptrons for Semi-Supervised Learning

It is difficult to train a multi-layer perceptron (MLP) when only a few labeled samples are available. However, by pretraining an MLP with a vast amount of unlabeled samples, we may achieve better generalization performance. Schulz et al. (2012) showed that it is possible to pretrain an MLP in a less greedy way by utilizing two-layer contractive encodings, however, with a cost...


Two-Layer Contractive Encodings with Shortcuts for Semi-supervised Learning

Supervised training of multi-layer perceptrons (MLP) with only a few labeled examples is prone to overfitting. Pretraining an MLP with unlabeled samples of the input distribution may achieve better generalization. Usually, pretraining is done in a layer-wise, greedy fashion, which limits the complexity of the learnable features. To overcome this limitation, two-layer contractive encodings have bee...


Two-layer contractive encodings for learning stable nonlinear features

Unsupervised learning of feature hierarchies is often a good strategy to initialize deep architectures for supervised learning. Most existing deep learning methods build these feature hierarchies layer by layer in a greedy fashion using either auto-encoders or restricted Boltzmann machines. Both yield encoders which compute linear projections of the input followed by a smooth thresholding function....


Deep Tensor Encodings

Learning an encoding of feature vectors in terms of an over-complete dictionary or an information geometric (Fisher vector) construct is widespread in statistical signal processing and computer vision. In content-based information retrieval using deep-learning classifiers, such encodings are learnt on the flattened last layer, without adherence to the multi-linear structure of the underlying feat...


Traffic and Quality Characterization of Scalable Encoded Video: A Large-Scale Trace-Based Study Part 4: Statistical Analysis of Spatial Scalable Encoded Video

In this part we study the traffic and quality characteristics of spatially scalable encoded videos. Six video sequences representing different genres were spatially encoded into two layers: the base layer and the enhancement layer. Each sequence is of CIF format and 30 minutes in length. We studied the traffic and quality characteristics of the base layer (QCIF) and the enhancement layer (traffic only)...



Publication date: 2012